The QR decomposition for radial neural networks.
Iordan Ganev (Weizmann)
Abstract: We present a perspective on neural networks stemming from quiver representation theory. This point of view emphasizes the symmetries inherent in neural networks, interacts nicely with gradient descent, and has the potential to improve training algorithms. As an application, we establish an analog of the QR decomposition for radial neural networks, which leads to a dimensional reduction result. This talk is intended for an audience with a background in representation theory; we explain all concepts relating to neural networks and machine learning from first principles. It is based on joint work with Robin Walters.
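To give a concrete point of reference for the classical factorization that the talk generalizes, here is a minimal sketch (not the construction from the talk) of a toy layer with a radial rescaling activation, assuming the common form that rescales a vector along its own direction by a function of its norm, together with the ordinary QR decomposition of its weight matrix via NumPy. The function name and parameters are illustrative only.

    import numpy as np

    def radial_rescaling(x, f=np.tanh, eps=1e-12):
        """Activation depending only on the norm of x: rescale x along its
        own direction by f(|x|)/|x|. (Assumed form, for illustration.)"""
        r = np.linalg.norm(x)
        return f(r) / (r + eps) * x

    # A toy layer: weight matrix W, bias b, radial activation.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((5, 3))   # 3 inputs -> 5 outputs
    b = rng.standard_normal(5)
    x = rng.standard_normal(3)

    h = radial_rescaling(W @ x + b)   # forward pass through the layer

    # Classical QR decomposition of the weight matrix: W = Q R, with Q having
    # orthonormal columns (5x3) and R upper triangular (3x3).
    Q, R = np.linalg.qr(W)
    print(np.allclose(Q @ R, W))      # True: the factorization reconstructs W

The talk's result is an analog of this matrix-level factorization adapted to entire radial networks, which is what yields the dimensional reduction; the snippet only recalls the classical decomposition it parallels.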
mathematical physics, algebraic geometry, category theory, representation theory
Audience: researchers in the topic
UMass Amherst Representation theory seminar
Organizers: Tina Kanstrup (contact for this listing), Chris Elliott
